Hardware implementation of radial-basis neural networks with Gaussian activation functions on FPGA

Authors

Abstract

This article introduces a method for realizing the Gaussian activation function of radial-basis-function (RBF) neural networks in hardware on field-programmable gate arrays (FPGAs). Modeling results for FPGA chips of different families are presented, and RBF networks of various topologies were synthesized and investigated. The component implemented with this algorithm is a network with four neurons in the hidden layer and one sigmoid output neuron, using 16-bit fixed-point numbers; it occupied 1193 lookup tables (LUTs). Each hidden neuron is designed as a separate computing unit. The total delay of the combinational circuit block is 101.579 ns. The Gaussian activation function block occupies 106 LUTs, with a delay of 29.33 ns and an absolute error of ±0.005. The Spartan-3 family was used to obtain these results; modeling on other chip series is also presented in the article. Such hardware implementations make these networks suitable for real-time control systems for high-speed objects.
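The abstract does not spell out how the Gaussian activation is computed in 16-bit fixed point, but a common FPGA approach is a block-ROM lookup table indexed by the quantized input. The sketch below illustrates that idea; the Q4.12 format, table depth, and input range are assumptions for illustration, not the paper's actual parameters.

```python
import math

FRAC_BITS = 12              # Q4.12 fixed-point format (assumed)
SCALE = 1 << FRAC_BITS
LUT_SIZE = 256              # table depth (assumed)
X_MAX = 4.0                 # beyond this, exp(-x^2) is effectively 0

# Precompute the table: LUT[i] holds exp(-x^2) at x = i * X_MAX / LUT_SIZE,
# quantized to Q4.12, as an FPGA block ROM might store it
LUT = [round(math.exp(-((i * X_MAX / LUT_SIZE) ** 2)) * SCALE)
       for i in range(LUT_SIZE)]

def gaussian_fixed(x_fixed):
    """Gaussian activation on a signed Q4.12 input, returning Q4.12."""
    x = abs(x_fixed) / SCALE             # exp(-x^2) is even, so fold the sign
    idx = min(int(x * LUT_SIZE / X_MAX), LUT_SIZE - 1)
    return LUT[idx]

# Compare against the floating-point reference at x = 1.5
x = 1.5
approx = gaussian_fixed(round(x * SCALE)) / SCALE
exact = math.exp(-x * x)
print(abs(approx - exact) < 0.005)   # within the ±0.005 error the paper reports
```

On hardware, the division and `min` collapse into a fixed shift and saturation; the per-activation cost is then a single ROM read, which is consistent with the small LUT footprint the abstract reports.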


Similar articles

Recurrent Neural Networks Hardware Implementation on FPGA

Recurrent Neural Networks (RNNs) have the ability to retain memory and learn data sequences. Due to the recurrent nature of RNNs, it is sometimes hard to parallelize all its computations on conventional hardware. CPUs do not currently offer large parallelism, while GPUs offer limited parallelism due to sequential components of RNN models. In this paper we present a hardware implementation of Lo...


Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection

In this study, two novel learning algorithms have been applied on Radial Basis Function Neural Network (RBFNN) to approximate the functions with high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea is concerning the various strategies to optimize the procedure of Gradient ...


Stable Computations with Gaussian Radial Basis Functions

Radial basis function (RBF) approximation is an extremely powerful tool for representing smooth functions in non-trivial geometries, since the method is meshfree and can be spectrally accurate. A perceived practical obstacle is that the interpolation matrix becomes increasingly ill-conditioned as the RBF shape parameter becomes small, corresponding to flat RBFs. Two stable approaches that overco...
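The ill-conditioning this abstract refers to is easy to demonstrate numerically. The snippet below is a quick illustration (not from the paper): the condition number of the Gaussian RBF interpolation matrix grows as the shape parameter eps shrinks toward flat basis functions. The node count and eps values are arbitrary choices for the demo.

```python
import numpy as np

nodes = np.linspace(0, 1, 10)           # 10 equispaced interpolation nodes

def rbf_matrix(eps):
    """Gaussian RBF interpolation matrix A_ij = exp(-(eps * (x_i - x_j))^2)."""
    r = nodes[:, None] - nodes[None, :]
    return np.exp(-(eps * r) ** 2)

# As eps decreases the basis functions flatten, the matrix rows become
# nearly identical, and the condition number blows up
conds = [np.linalg.cond(rbf_matrix(eps)) for eps in (4.0, 1.0, 0.25)]
print(conds)
```

Running this shows the condition number increasing by many orders of magnitude across the three eps values, which is why direct solves with flat Gaussians fail in floating point and stable reformulations are needed.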


Complexity of Gaussian-radial-basis networks approximating smooth functions

Complexity of Gaussian radial-basis-function networks, with varying widths, is investigated. Upper bounds on rates of decrease of approximation errors with increasing number of hidden units are derived. Bounds are in terms of norms measuring smoothness (Bessel and Sobolev norms) multiplied by explicitly given functions a(r, d) of the number of variables d and degree of smoothness r. Estimates a...


Artificial Neural Networks Processor - A Hardware Implementation Using a FPGA

Several implementations of Artificial Neural Networks have been reported in scientific papers. Nevertheless, these implementations do not allow the direct use of off-line trained networks, because of their much lower precision compared with the software environments in which the networks are prepared, or because of modifications to the activation function. In the present work a hardware solution called Artificial Neur...



Journal

Journal title: Neural Computing and Applications

Year: 2021

ISSN: 0941-0643, 1433-3058

DOI: https://doi.org/10.1007/s00521-021-05706-3